A bilinear formulation for vector sparsity optimization

Authors

  • Dori Peleg
  • Ron Meir
Abstract

Sparsity plays an important role in many fields of engineering. The cardinality penalty function, often used as a measure of sparsity, is neither continuous nor differentiable and therefore smooth optimization algorithms cannot be applied directly. In this paper we present a continuous yet non-differentiable sparsity function which constitutes a tight lower bound on the cardinality function. The novelty of this approach is that we cast the problem of minimizing the new sparsity function as a problem with a bilinear objective function. We present a numerical comparison to other sparsity encouraging penalty functions for several applications. Additionally, we apply the techniques developed to minimize an objective function with a truncated hinge loss function. We present highly competitive results for all of the applications.
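The abstract's key idea, a continuous, non-differentiable lower bound on the cardinality function that admits a bilinear reformulation, can be illustrated with the capped-ℓ1 penalty. This is a minimal sketch of one such construction, not necessarily the authors' exact formulation: for any t > 0, φ_t(x) = Σᵢ min(|xᵢ|/t, 1) lower-bounds card(x) and equals the minimum over y ∈ [0,1]ⁿ of a function that is bilinear in (|x|, y).

```python
import numpy as np

def capped_l1(x, t):
    """Continuous, non-differentiable lower bound on card(x):
    phi_t(x) = sum_i min(|x_i| / t, 1)."""
    return np.minimum(np.abs(x) / t, 1.0).sum()

def bilinear_form(x, t):
    """Equivalent bilinear formulation (illustrative, not the paper's exact one):
    phi_t(x) = min_{y in [0,1]^n} sum_i [ y_i + (1 - y_i) * |x_i| / t ].
    For fixed x the inner problem is linear in each y_i, so the minimum
    is attained at a vertex: y_i = 1 iff |x_i| / t >= 1."""
    a = np.abs(x) / t
    y = (a >= 1.0).astype(float)      # optimal vertex of the box [0,1]^n
    return (y + (1.0 - y) * a).sum()

x = np.array([0.0, 0.05, 2.0, -3.0])
t = 0.1
card = np.count_nonzero(x)            # cardinality = 3
print(capped_l1(x, t))                # 2.5
print(bilinear_form(x, t))            # 2.5 -- the two forms agree
print(capped_l1(x, t) <= card)        # True -- lower bound on card(x)
```

Because the objective is linear in y for fixed x (and piecewise linear in |x| for fixed y), alternating minimization over the two blocks is the natural algorithmic strategy for such bilinear programs, which is the sort of structure the paper exploits.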


Similar articles

A jointly constrained bilinear programming method for solving generalized Cournot-Pareto models

We propose a vector optimization approach to linear Cournot oligopolistic market equilibrium models where the strategy sets depend on each other. We use scalarization technique to find a Pareto efficient solution to the model by using a jointly constrained bilinear programming formulation. We then propose a decomposition branch-and-bound algorithm for globally solving the resulting bilinear pro...


Multi-instance Support Vector Machine Based on Convex Combination∗

This paper presents a new formulation of multi-instance learning as a maximum margin problem, which is an extension of standard C-support vector classification. For linear classification, this extension leads to a continuous optimization problem, instead of a mixed integer quadratic program, where the objective function is convex quadratic and the constraints are either linear or bilinea...


An efficient formulation of sparsity controlled support vector regression

Support Vector Regression (SVR) is a kernel-based regression method capable of implementing a variety of regularisation techniques. Implementation of SVR usually follows a dual optimisation technique that includes Vapnik's ε-insensitive zone. The number of terms in the resulting SVR approximation function depends on the size of this zone, but improving sparsity by increasing the size of th...


Message passing-based joint CFO and channel estimation in millimeter wave systems with one-bit ADCs

Channel estimation at millimeter wave (mmWave) is challenging when large antenna arrays are used. Prior work has leveraged the sparse nature of mmWave channels via compressed sensing based algorithms for channel estimation. Most of these algorithms, though, assume perfect synchronization and are vulnerable to phase errors that arise due to carrier frequency offset (CFO) and phase noise. Recentl...


Recognizing underlying sparsity in optimization

Exploiting sparsity is essential to improve the efficiency of solving large optimization problems. We present a method for recognizing the underlying sparsity structure of a nonlinear partially separable problem, and show how the sparsity of the Hessian matrices of the problem’s functions can be improved by performing a nonsingular linear transformation in the space corresponding to the vector ...



Journal:
  • Signal Processing

Volume 88, Issue

Pages  -

Published 2008